
JIPS (Korea Information Processing Society)


Title: A Federated Multi-Task Learning Model Based on Adaptive Distributed Data Latent Correlation Analysis
Authors: Shengbin Wu, Yibai Wang
Citation: Vol. 17, No. 3, pp. 441-452 (June 2021)
Abstract:
Federated learning provides an efficient integrated model for distributed data, allowing different data to be trained locally. Meanwhile, the goal of multi-task learning is to simultaneously build models for multiple related tasks and to recover their shared underlying structure. However, traditional federated multi-task learning models not only impose strict requirements on the data distribution, but also demand large amounts of computation and converge slowly, which has hindered their adoption in many fields. In our work, we apply a rank constraint to the weight vectors of the multi-task learning model to adaptively adjust the tasks' similarity learning according to the distribution of the federated nodes' data. The proposed model has a general framework for finding optimal solutions, which can be used to handle various data types. Experiments show that our model achieves the best results on different datasets. Notably, our model still obtains stable results on datasets with large distribution differences. In addition, compared with traditional federated multi-task learning models, our algorithm is able to converge to a local optimal solution within a limited number of training iterations.
Keywords: Data Distribution, Federated Multi-Task Learning, Rank Constraint, Underlying Structure
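The rank constraint on the task weight vectors described in the abstract can be illustrated with a minimal sketch. The assumptions here are not taken from the paper: squared-error local losses, a nuclear-norm surrogate for the rank constraint, and proximal gradient steps where each node computes its local gradient and a server applies singular value thresholding to the stacked weight matrix. All function names (`svt`, `federated_mtl`) are illustrative.

```python
import numpy as np

def svt(W, tau):
    """Singular value thresholding: the proximal operator of tau * ||W||_*
    (nuclear norm), a standard convex surrogate for a low-rank constraint."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def federated_mtl(nodes, dim, lam=0.1, lr=0.01, iters=200):
    """Hypothetical sketch: each node holds (X, y) for one task; W has one
    column per task. Nodes compute local least-squares gradients; the
    server's nuclear-norm prox after each averaged step keeps W low-rank,
    so related tasks share an underlying structure."""
    W = np.zeros((dim, len(nodes)))
    for _ in range(iters):
        # local gradient of (1/2n)||Xw - y||^2, computed at each node
        G = np.column_stack([X.T @ (X @ W[:, t] - y) / len(y)
                             for t, (X, y) in enumerate(nodes)])
        # server-side proximal step enforcing the (relaxed) rank constraint
        W = svt(W - lr * G, lr * lam)
    return W
```

The key design point this sketch captures is that the coupling between tasks happens only through the shared low-rank structure of `W`: raw data never leaves a node, and the degree of task similarity is adjusted implicitly by how many singular values survive the thresholding.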